    Stochastic Intermediate Gradient Method for Convex Problems with Inexact Stochastic Oracle

    In this paper we introduce new methods for convex optimization problems with an inexact stochastic oracle. The first method is an extension of the intermediate gradient method proposed by Devolder, Glineur and Nesterov for problems with an inexact oracle. Our new method can be applied to problems with composite structure and a stochastic inexact oracle, and allows a non-Euclidean setup. We prove estimates for the mean rate of convergence and for the probabilities of large deviations from this rate. We also introduce two modifications of this method for strongly convex problems. For the first modification we prove mean-rate-of-convergence estimates, and for the second we prove estimates for large deviations from the mean rate of convergence. All the rates yield complexity estimates for the proposed methods which, up to a multiplicative constant, coincide with the lower complexity bound for the considered class of convex composite optimization problems with a stochastic inexact oracle.
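    As a minimal illustration of the inexact-stochastic-oracle setting described in this abstract (not the paper's intermediate gradient method itself), the Python sketch below runs plain stochastic gradient descent with iterate averaging on a least-squares problem whose oracle returns a noise-corrupted gradient. The names `noisy_gradient` and `averaged_sgd`, the test problem, and the Gaussian noise model are all hypothetical choices for the sketch.

```python
import numpy as np

def noisy_gradient(x, A, b, noise_std=0.1, rng=None):
    """Inexact stochastic oracle: true gradient of 0.5*||Ax - b||^2
    plus zero-mean Gaussian noise (the stochastic inexactness)."""
    if rng is None:
        rng = np.random.default_rng()
    g = A.T @ (A @ x - b)              # exact gradient
    return g + noise_std * rng.standard_normal(g.shape)

def averaged_sgd(A, b, steps=5000, step_size=1e-3, seed=0):
    """SGD with running iterate averaging; averaging damps the oracle
    noise, loosely echoing the mean-rate bounds the abstract discusses
    (illustrative only, not the paper's algorithm)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    x_avg = np.zeros_like(x)
    for t in range(1, steps + 1):
        x -= step_size * noisy_gradient(x, A, b, rng=rng)
        x_avg += (x - x_avg) / t       # running average of iterates
    return x_avg

# Usage on a small synthetic least-squares instance
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
x_hat = averaged_sgd(A, b)
print(0.5 * np.linalg.norm(A @ x_hat - b) ** 2)
```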

    Primal-dual accelerated gradient methods with small-dimensional relaxation oracle

    In this paper, a new variant of accelerated gradient descent is proposed. The proposed method does not require any information about the objective function, uses exact line search for the practical acceleration of convergence, converges according to the well-known lower bounds for both convex and non-convex objective functions, possesses primal-dual properties, and can be applied in the non-Euclidean setup. As far as we know, this is the first such method possessing all of the above properties at the same time. We also present a universal version of the method which is applicable to non-smooth problems. We demonstrate how, in practice, one can efficiently use the combination of line search and primal-duality by considering a convex optimization problem with a simple structure (for example, linearly constrained).
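    To illustrate the exact-line-search ingredient this abstract mentions (a sketch only, not the paper's primal-dual accelerated scheme), the Python snippet below performs steepest descent on a quadratic, where the optimal step along the negative gradient has a closed form. The function name and the test problem are hypothetical.

```python
import numpy as np

def steepest_descent_exact_ls(Q, c, x0, iters=100):
    """Steepest descent on f(x) = 0.5 x^T Q x - c^T x with EXACT line
    search: for a quadratic the optimal step along -g has the closed
    form t = (g^T g) / (g^T Q g).  Illustrates only the exact
    line-search building block, not acceleration or primal-duality."""
    x = x0.copy()
    for _ in range(iters):
        g = Q @ x - c                  # gradient of the quadratic
        gQg = g @ Q @ g
        if gQg <= 1e-16:               # gradient numerically zero
            break
        t = (g @ g) / gQg              # exact minimizer along -g
        x -= t * g
    return x

# Usage: small SPD quadratic, compared against a direct solve
rng = np.random.default_rng(0)
M = rng.standard_normal((8, 8))
Q = M @ M.T + 8 * np.eye(8)            # symmetric positive definite
c = rng.standard_normal(8)
x = steepest_descent_exact_ls(Q, c, np.zeros(8))
print(np.linalg.norm(x - np.linalg.solve(Q, c)))
```

    The closed-form step is what makes the line search "exact" here; for non-quadratic objectives, an exact line search would instead require a one-dimensional minimization at each iteration.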